
Update RMSNorm.forward to accept 2D or 3D input shape #1221

Merged
jlarson4 merged 1 commit into TransformerLensOrg:dev from brendanlong:brendanlong/gemma3-attention-shape
Apr 3, 2026

Conversation

@brendanlong
Contributor

@brendanlong brendanlong commented Mar 29, 2026

Description

RMSNorm.forward() has a jaxtyping hint expecting a 3D tensor (batch, pos, length), but _apply_qk_norm was reshaping to 2D (batch*pos*heads, d_head), causing a BeartypeCallHintParamViolation on Gemma 3 models.

RMSNorm only operates on the last dimension, so this updates the type hint to accept 2D as well as 3D input.
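The claim that leading dimensions don't matter can be checked with a minimal sketch (not TransformerLens's actual implementation): RMSNorm normalizes each vector along the last axis independently, so flattening (batch, pos, heads, d_head) into (batch*pos*heads, d_head) gives identical results.

```python
import torch

def rms_norm(x: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    # Normalize over the last dimension only; leading dims are irrelevant.
    scale = (x.pow(2).mean(-1, keepdim=True) + eps).rsqrt()
    return x * scale

x = torch.randn(2, 3, 4, 8)   # (batch, pos, heads, d_head)
flat = x.reshape(-1, 8)       # (batch*pos*heads, d_head), as in _apply_qk_norm
assert torch.allclose(rms_norm(x).reshape(-1, 8), rms_norm(flat), atol=1e-6)
```

Because the math is per-row along the last axis, only the jaxtyping annotation blocked the 2D call, not the computation itself.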

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)

Checklist:

  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have not rewritten tests relating to key interfaces which would affect backward compatibility

Note: I have new tests in progress in brendanlong#8 that failed due to this bug

@jlarson4 jlarson4 changed the base branch from main to dev March 31, 2026 16:10
@jlarson4
Collaborator

jlarson4 commented Apr 2, 2026

A non-breaking alternative solution would be to adjust the type hint to allow the 2D caller. I'd recommend establishing a custom Union type, RMSNormInput, which you can then use to type both the input and output of RMSNorm.forward.

RMSNorm.forward() has a jaxtyping hint expecting a 3D tensor (batch,
pos, length), but _apply_qk_norm was reshaping to 2D (batch*pos*heads,
d_head), causing a BeartypeCallHintParamViolation on Gemma 3 models.

Update the type hint to allow 2D as well as 3D, since the code only
cares about the last dimension.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@brendanlong brendanlong force-pushed the brendanlong/gemma3-attention-shape branch from 378af45 to 4f7c0c2 Compare April 3, 2026 04:35
@brendanlong brendanlong changed the title Fix QK norm reshape to match RMSNorm's expected 3D input shape Update RMSNorm.forward to accept 2D or 3D input shape Apr 3, 2026
@brendanlong
Contributor Author

@jlarson4 I updated it to take that approach instead.

@jlarson4 jlarson4 merged commit 30b6a74 into TransformerLensOrg:dev Apr 3, 2026
13 checks passed
@brendanlong brendanlong deleted the brendanlong/gemma3-attention-shape branch April 3, 2026 11:05